# Autoregressive architecture
## Janus Pro 1B (deepseek-community)
License: MIT
Tags: Text-to-Image · Transformers
Downloads: 4,636 · Likes: 0

Janus-Pro is a novel autoregressive framework that unifies multimodal understanding and generation tasks and enhances flexibility by decoupling visual encoding.
## Codellama 13b Python Hf (meta-llama)
Tags: Large Language Model · Transformers · Other
Downloads: 636 · Likes: 7

Code Llama is a series of pretrained and fine-tuned generative text models released by Meta, with parameter scales ranging from 7 billion to 34 billion. This model is the 13-billion-parameter Python-specialized version.
## Codellama 34b Python Hf (codellama)
Tags: Large Language Model · Transformers · Other
Downloads: 2,135 · Likes: 96

Code Llama is a 34-billion-parameter Python-specialized code generation model developed by Meta. It is built on the Llama 2 architecture and focuses on Python code synthesis and understanding.
## Gpt Neo 2.7B (EleutherAI)
License: MIT
Tags: Large Language Model · English
Downloads: 52.68k · Likes: 486

GPT-Neo 2.7B is a 2.7-billion-parameter Transformer language model, EleutherAI's replication of the GPT-3 architecture, trained on the Pile dataset.